Okay, so that is the whole story of why we run this course every semester: simply because the demand is so high.
And now let's go and directly dig into deep learning.
I mean, you have heard about this before, and you are here for a reason: you want to learn about a technology that has only been around in its current form for a relatively short time
but is extremely popular right now.
And let me introduce the team.
So who are we?
So that's me, Andreas Meyer.
And then we have a lot of supporters who actually get this entire lecture running
and repeat it each semester.
There is Tobias; Vincent is missing,
as he is no longer here.
Vincent and Katharina did a tremendous job in creating the first set of slides.
Then Leonid and Nishant also jumped in and helped.
And Matis, is this the first time you're involved in this lecture?
And Sebastian will get involved in the supervision of the exercises.
So obviously, if we demand a lot from you, then we also have to provide plenty of supervision
so that your questions can be answered.
And obviously, Florian Tham and Felix Densinger are two students who took the class themselves,
and we didn't scare them off enough, so they actually decided to work with us.
And they will also support the programming exercises.
So don't worry.
It's doable.
It's going to be doable, but it's going to be a lot of work.
Good.
So there are a lot of buzzwords around deep learning.
There's supervised and unsupervised learning.
There are obviously the neural networks that we will discuss.
And then there are all these big buzzwords: big data, artificial intelligence, machine
learning, representation learning.
Well, we will talk about representation learning and obviously machine learning.
You could argue that all of this is artificial intelligence.
A couple of years ago, nobody would have said they were doing artificial intelligence, because
back then everybody "knew" that artificial intelligence doesn't work.
So nobody would even use the term, because obviously you want to do stuff that works.
Now the term artificial intelligence is back.
Everybody is doing AI now,
and it's a huge catchphrase.
If you're interested in attracting money from venture capitalists and so on, you do AI now.
You do deep learning, you do AI.
You don't really have to understand what you're doing.
You just have to attract a lot of money and then build your company and sell the stuff.
It doesn't matter whether it works or not.
So these are the big buzzwords.
And in the end, what we are really interested in doing is machine learning and
representation learning.
These boil down to tasks that consist of solving classification, segmentation,
regression, and generation problems.
These are the things that we will actually be doing.
Presenters
Accessible via
Open access
Duration
01:27:00 min
Recording date
2018-10-16
Uploaded on
2018-10-16 21:19:04
Language
en-US
Deep Learning (DL) has attracted much interest, both from academia and industry, in a wide range of applications such as image recognition, speech recognition, and artificial intelligence. This lecture introduces the core elements of neural networks and deep learning; it comprises:
- (multilayer) perceptron, backpropagation, fully connected neural networks
- loss functions and optimization strategies
- convolutional neural networks (CNNs)
- activation functions
- regularization strategies
- common practices for training and evaluating neural networks
- visualization of networks and results
- common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
- recurrent neural networks (RNN, TBPTT, LSTM, GRU)
- deep reinforcement learning
- unsupervised learning (autoencoder, RBM, DBM, VAE)
- generative adversarial networks (GANs)
- weakly supervised learning
- applications of deep learning (segmentation, object detection, speech recognition, ...)
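
As a small taste of the first item in the list, here is a minimal sketch of a single perceptron trained with the classic Rosenblatt learning rule; the toy AND-gate data, learning rate, and epoch count are illustrative choices, not part of the course materials.

```python
# Minimal perceptron sketch (illustrative only): learn the logical AND
# function with the classic perceptron update rule.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights, initialized to zero
b = 0.0          # bias
lr = 0.1         # learning rate (arbitrary illustrative value)

def predict(x):
    # Step activation: output 1 iff the weighted sum exceeds zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for epoch in range(20):  # AND is linearly separable, so this converges
    for x, y in data:
        err = y - predict(x)          # 0 if correct, +/-1 otherwise
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
```

The lecture's multilayer perceptrons replace the non-differentiable step activation with smooth ones so that backpropagation can compute gradients through several layers.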